A Large-Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
Author
Joseph E. Cavanaugh
Abstract
The Akaike information criterion, AIC, is a widely known and extensively used tool for statistical model selection. AIC serves as an asymptotically unbiased estimator of a variant of Kullback's directed divergence between the true model and a fitted approximating model. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternate directed divergence may be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. Since the symmetric divergence combines the information in two related though distinct measures, it functions as a gauge of model disparity which is arguably more sensitive than either of its individual components. With this motivation, we propose a model selection criterion which serves as an asymptotically unbiased estimator of a variant of the symmetric divergence between the true model and a fitted approximating model. We examine the performance of the criterion relative to other well-known criteria in a simulation study.
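For reference, here is a minimal sketch of the quantities the abstract refers to, in assumed notation (f denotes the true density, g_theta a fitted candidate model with k parameters, and L the likelihood; the explicit penalty 3k for the proposed criterion, KIC, is taken from the published literature on this criterion rather than from the abstract itself):

\[
  I(f, g_\theta) = \mathrm{E}_f\bigl[\log f(Y) - \log g_\theta(Y)\bigr]
  \qquad \text{(Kullback's directed divergence)}
\]
\[
  J(f, g_\theta) = I(f, g_\theta) + I(g_\theta, f)
  \qquad \text{(Kullback's symmetric divergence)}
\]
\[
  \mathrm{AIC} = -2\log L(\hat\theta) + 2k,
  \qquad
  \mathrm{KIC} = -2\log L(\hat\theta) + 3k.
\]

Reversing the roles of f and g_theta in I gives the alternate directed divergence I(g_theta, f); the symmetric divergence J is the sum of the two, which is why it can register a discrepancy that either directed component alone may understate.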
Related Articles
IMPROVING THE SELECTION OF SYMMETRIC WEIGHTS AS A SECONDARY GOAL IN DEA CROSS-EFFICIENCY EVALUATION
Recently, some authors proposed the use of symmetric weights for computing the elements of the cross-efficiency matrix. In spite of the fact that the proposed method decreases the number of zero weights, a large number of zero weights may still exist among input and output symmetric weights. To decrease the number of input and output symmetric weights, this paper improves the proposed secondary goal mode...
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating a true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using a model fθ(x) that is appropriate in some sense (see below). Using Vuong's (1989) test along with a collection of k (> 2) non-nested models, this article constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...
A Bootstrap Model Selection Criterion Based on Kullback’s Symmetric Divergence
Following on recent work of [Cavanaugh, 1999] and [Seghouane, 2002], we propose a new corrected variant of KIC developed for the purpose of source separation. Our variant utilizes bootstrapping to provide an estimate of the expected Kullback-Leibler symmetric divergence between the model generating the data and a fitted approximating model. Simulation results that illustrate the performance of t...
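As a rough illustration of the bootstrap idea described in this snippet, the following is a minimal Python sketch of an EIC-style bootstrap penalty for a Gaussian linear model. The setup, function names, and model family are assumptions for illustration only, not the cited authors' algorithm.

import numpy as np

def gauss_loglik(y, X, beta, sigma2):
    # Gaussian log-likelihood of y under the linear model X @ beta
    resid = y - X @ beta
    n = len(y)
    return -0.5 * n * np.log(2.0 * np.pi * sigma2) - 0.5 * (resid @ resid) / sigma2

def fit_ml(y, X):
    # Maximum-likelihood fit: OLS coefficients and the ML variance (divide by n)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, (resid @ resid) / len(y)

def bootstrap_penalty(y, X, B=200, seed=0):
    # Bootstrap estimate of the optimism of -2 log L(theta_hat): refit on each
    # resample, compare the resample's own fit with the fit that estimate
    # delivers on the original data, then average and double the gap
    # (an EIC-style bias correction; illustrative, not the cited method).
    rng = np.random.default_rng(seed)
    n = len(y)
    gaps = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)           # case resampling
        beta_b, sigma2_b = fit_ml(y[idx], X[idx])
        gaps.append(gauss_loglik(y[idx], X[idx], beta_b, sigma2_b)
                    - gauss_loglik(y, X, beta_b, sigma2_b))
    return 2.0 * np.mean(gaps)

# Usage sketch for one candidate design matrix X:
# beta, sigma2 = fit_ml(y, X)
# crit = -2.0 * gauss_loglik(y, X, beta, sigma2) + bootstrap_penalty(y, X)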
Criteria for longitudinal data model selection based on Kullback's symmetric divergence
Recently, Azari et al. (2006) showed that the AIC criterion and its corrected versions cannot be directly applied to model selection for longitudinal data with correlated errors. They proposed two model selection criteria, AICc and RICc, by applying likelihood and residual likelihood approaches. These two criteria are estimators of the Kullback-Leibler divergence distance, which is asymmetric. In...
Understanding Information Criteria for Selection among Capture-recapture or Ring Recovery Models
We provide background information to allow a heuristic understanding of two types of criteria used in selecting a model for making inferences from ringing data. The first type of criteria (e.g., AIC, AICc, QAIC and TIC) are estimates of (relative) Kullback-Leibler information or distance and attempt to select a good approximating model for inference, based on the Principle of Parsimony. The seco...